Description: Adaptive neural network source code.
The Adaptive Neural Network Library is a collection of blocks that implement several adaptive neural networks featuring
different adaptation algorithms.
There are 11 blocks that implement, broadly, these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Function (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included. (A minimal ADALINE sketch follows this entry.) Platform: |
Size: 200530 |
Author:周志连 |
Hits:
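To make the ADALINE block above concrete, here is a minimal sketch of LMS (Widrow-Hoff) adaptation in MATLAB. It is not code from this library; the function name, signature, and step size mu are illustrative assumptions.

% Minimal ADALINE with LMS (Widrow-Hoff) adaptation -- an illustrative
% sketch, not the library's own block implementation.
function [w, e] = adaline_lms(X, d, mu)
% X  : n-by-p matrix of input samples (one sample per row)
% d  : n-by-1 vector of desired outputs
% mu : LMS step size (illustrative choice, e.g. 0.01)
[n, p] = size(X);
w = zeros(p, 1);               % initial weights
e = zeros(n, 1);               % per-sample error trace
for k = 1:n
    x = X(k, :)';              % current input sample
    y = w' * x;                % linear (ADALINE) output
    e(k) = d(k) - y;           % instantaneous error
    w = w + mu * e(k) * x;     % LMS weight update
end
end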
Description: The Adaptive Neural Network Library is a collection of blocks that implement several adaptive neural networks featuring
different adaptation algorithms.
There are 11 blocks that implement, broadly, these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Function (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included. (An RBF forward-pass sketch follows this entry.) Platform: |
Size: 198792 |
Author:叶建槐 |
Hits:
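To make the RBF entries concrete, here is a minimal Gaussian RBF forward pass in MATLAB. It is an illustrative sketch, not this library's block; the function name and the shared width sigma are assumptions.

% Gaussian RBF network forward pass -- illustrative sketch only.
function y = rbf_forward(x, C, sigma, W)
% x     : p-by-1 input vector
% C     : p-by-m matrix of m radial-basis centers
% sigma : scalar Gaussian width (assumed shared by all units)
% W     : m-by-1 output weight vector
m = size(C, 2);
phi = zeros(m, 1);
for j = 1:m
    r2 = sum((x - C(:, j)).^2);      % squared distance to center j
    phi(j) = exp(-r2 / (2*sigma^2)); % Gaussian activation
end
y = W' * phi;                        % linear output layer
end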
Description: A BP (backpropagation) algorithm I wrote in MATLAB as part of my graduation project. The program runs without problems; corrections are welcome. Platform: |
Size: 3072 |
Author:FX |
Hits:
Description: Adaptive neural network source code.
The Adaptive Neural Network Library is a collection of blocks that implement several adaptive neural networks featuring
different adaptation algorithms.
There are 11 blocks that implement, broadly, these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Function (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included. Platform: |
Size: 200704 |
Author:周志连 |
Hits:
Description: The Adaptive Neural Network Library is a collection of blocks that implement several adaptive neural networks featuring
different adaptation algorithms.
There are 11 blocks that implement, broadly, these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Function (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included. Platform: |
Size: 198656 |
Author:叶建槐 |
Hits:
Description: Implements the BP neural network algorithm as a class library. New classes can be derived from the base class to build a wide range of applications. (An illustrative inheritance sketch follows this entry.) Platform: |
Size: 5120 |
Author:唐述 |
Hits:
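The entry does not state the package's language or API, so the following sketch only illustrates the described design idea (derive application classes from a base BP-network class), using MATLAB classdef syntax with hypothetical names. Each classdef belongs in its own .m file.

% BPNetwork.m -- hypothetical base class holding the network weights.
classdef BPNetwork
    properties
        W1 = [];               % input-to-hidden weights
        W2 = [];               % hidden-to-output weights
    end
    methods
        function y = forward(obj, x)
            % one tanh hidden layer, linear output
            y = obj.W2 * tanh(obj.W1 * x);
        end
    end
end

% XorNet.m -- hypothetical derived class for one specific application.
classdef XorNet < BPNetwork
    methods
        function label = classify(obj, x)
            label = obj.forward(x) > 0.5;  % reuse the inherited forward pass
        end
    end
end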
Description: Batch version of the back-propagation algorithm.
% Given a set of corresponding input-output pairs and an initial network,
% [W1,W2,critvec,iter]=batbp(NetDef,W1,W2,PHI,Y,trparms) trains the
% network with backpropagation.
%
% The activation functions must be either linear or tanh. The network
% architecture is defined by the matrix NetDef consisting of two
% rows. The first row specifies the hidden layer while the second
% specifies the output layer.
%
An assumed usage sketch for this interface follows this entry. Platform: |
Size: 2048 |
Author:张镇 |
Hits:
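Below is a hypothetical call sketch for the batbp interface documented above. The NetDef convention ('H' = tanh hidden unit, 'L' = linear output unit, '-' = padding) and the trparms layout are assumptions inferred from the help text; verify them against the package's own documentation before use.

% Hypothetical usage of batbp -- all parameter choices are assumptions.
NetDef = ['HHHHH'              % assumed: 5 tanh hidden units
          'L----'];            % assumed: 1 linear output unit
PHI = rand(2, 100);            % 2 inputs, 100 training samples
Y   = sin(PHI(1, :)) + PHI(2, :);  % example targets
W1  = 0.1*randn(5, 3);         % hidden weights: 2 inputs + bias column
W2  = 0.1*randn(1, 6);         % output weights: 5 hidden + bias column
trparms = [500 1e-4 0.01 0.9]; % assumed [max_iter stop_crit eta momentum]
[W1, W2, critvec, iter] = batbp(NetDef, W1, W2, PHI, Y, trparms);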
Description: An MLP with a backpropagation training algorithm, designed for function approximation problems. Platform: |
Size: 1024 |
Author:Paulo |
Hits:
Description: Backpropagation
Backpropagation is a supervised learning algorithm, mainly used by multi-layer perceptrons to adjust the weights connected to the net's hidden neuron layer(s).
The backpropagation algorithm uses the computed output error to change the weight values in the backward direction. (A sketch of the update step follows this entry.) Platform: |
Size: 237568 |
Author:muchizmo |
Hits:
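As a concrete picture of "propagating the output error backward", here is one gradient step for a one-hidden-layer MLP in MATLAB. This is a generic textbook sketch under assumed choices (squared-error loss, tanh hidden layer, linear output), not code from this package.

% One backpropagation update for a 1-hidden-layer MLP (generic sketch).
function [W1, W2] = bp_step(W1, W2, x, t, eta)
% x : p-by-1 input, t : k-by-1 target, eta : learning rate (assumed)
h   = tanh(W1 * x);             % hidden activations
y   = W2 * h;                   % linear network output
e   = y - t;                    % output error
dW2 = e * h';                   % output-layer gradient
dh  = (W2' * e) .* (1 - h.^2);  % error propagated back through tanh
dW1 = dh * x';                  % hidden-layer gradient
W1  = W1 - eta * dW1;           % gradient-descent weight updates
W2  = W2 - eta * dW2;
end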
Description: Solves the XNOR problem using the backpropagation algorithm. (A setup sketch follows this entry.) Platform: |
Size: 1024 |
Author:Aead Amer |
Hits:
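For reference, this is how the XNOR task can be wired up with the hypothetical bp_step sketch from the previous entry; it is not this package's own code. The input is augmented with a constant 1 so the hidden units get bias terms.

% XNOR via backpropagation -- illustrative setup, not this package's code.
X = [0 0 1 1;
     0 1 0 1;
     1 1 1 1];                  % third row = constant bias input
T = [1 0 0 1];                  % XNOR truth table
W1 = 0.5*randn(2, 3);           % 2 hidden units, 2 inputs + bias
W2 = 0.5*randn(1, 2);
for epoch = 1:10000
    for k = 1:4
        [W1, W2] = bp_step(W1, W2, X(:, k), T(k), 0.1);
    end
end
disp(W2 * tanh(W1 * X) > 0.5)   % ideally prints 1 0 0 1; re-run with a
                                % new random seed if training gets stuck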
Description: The main contribution of this paper is the use of optimal control theory to improve the convergence rate of the backpropagation algorithm. In the proposed approach, the backpropagation learning algorithm is modeled as a minimum-time control problem in which the step size of its learning factor is treated as the input of the model. In contrast to traditional backpropagation learning algorithms, which tune the step size by trial and error, here it is selected adaptively based on an optimal control criterion. The effectiveness of the proposed algorithm is evaluated in two simulations: XOR and 3-bit parity. In both simulation examples, the proposed algorithm performs well in speed and in the ability to escape local minima. (A simplified step-size adaptation sketch follows this entry.) Platform: |
Size: 415744 |
Author:samir |
Hits:
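The paper's minimum-time optimal control formulation is not reproduced here. As a much simpler illustration of the general idea (adapting the step size instead of fixing it by trial and error), below is a "bold driver" style learning-rate heuristic in MATLAB; it is a generic stand-in, not the authors' method.

% Generic adaptive step-size gradient step ("bold driver" heuristic).
% A simple stand-in for adaptive step-size selection, not the paper's
% minimum-time optimal control method.
function [w, eta] = adaptive_step(w, grad_fn, loss_fn, eta)
loss0 = loss_fn(w);
w_try = w - eta * grad_fn(w);   % tentative gradient step
if loss_fn(w_try) < loss0
    w = w_try;                  % loss fell: accept and grow step size
    eta = 1.1 * eta;
else
    eta = 0.5 * eta;            % loss rose: reject step, shrink step size
end
end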